June 2001
Distributed Web Applications and Microsoft .NET

Laurence Press, Ph.D.

Driven by constant technological progress, business data processing systems have evolved through four generations: batch processing, timesharing with simple terminals, client-server systems in which the client is a personal computer running a custom program, and Web-based systems in which the client is a personal computer running a Web browser. (See the sidebar on these generations.) Are we at the dawn of a new paradigm, distributed Web applications?
Last year, Microsoft began talking about its forthcoming .NET initiative for distributed Web applications, in which Web browsers (with extensions) will access information on distributed Web sites and combine it into an integrated application. The user may seem to be working with one remote server but may in fact be accessing information and applications from several machines and organizations (generating revenue for them all). Bill Gates and other Microsoft executives state that .NET will be as important to the company as DOS, Windows and the Internet were -- they say they are betting the company on it.

We already have examples of distributed applications on the Web. One of my favorites is trip.com, which repackages real-time flight information from the air traffic control system. Their Web site displays the same flight-progress maps, flight speed, expected time of arrival, etc. that are available to passengers on planes. Subscribers to the Consumer Reports Web site gain transparent access to the product-information database maintained by Active Buyers Guide. Many companies import current and historical information on their stock value for display on their Web sites, and Moreover.com enables you to include news feeds on dozens of topics on your Web site.

As useful as these applications are, they are all ad hoc. The information providers have each developed their own interfaces to access information and processes on their servers. This ad hoc development is inefficient, so there has been a groundswell of activity around three standards to systematize the process: XML (Extensible Markup Language), SOAP (Simple Object Access Protocol), and UDDI (Universal Description, Discovery, and Integration).
They are used in .NET and in competing, less well-publicized initiatives from IBM and Sun.[1] .NET seems to be somewhat ahead of the other initiatives,[2] and Microsoft has demonstrated applications developed with pre-release tools and running on pre-release servers at several conferences. One of their demonstrations shows a travel agency service that automatically sends a message to a customer's cell phone an hour before their flight is scheduled to depart or if it is delayed. Another message is sent to the traveler's secretary an hour before the plane lands. The customer sets all this up with a single visit to the travel agency site, which accesses the traveler's address book (to get the secretary's email address) from one server, flight schedules from the airline server, and in-flight status from the air-traffic control system.

XML is the best established and supported of the three distributed Web standards. It allows us to define markup tags to make the structure of a document explicit. Thus we could define a <price> tag for a document containing information on a stock quote or an <author> tag for a document describing a book. The definitions of the tags are up to the application developer; hence, the name "extensible" markup language.[3] A client program receiving an XML document would be able to find the appropriate information and present it appropriately, and a server program would be able to parse a request and respond appropriately. The request to a server would be handled by SOAP, which makes a remote service request, passing parameters and receiving results. UDDI would be used to advertise and discover services on the network.

Summarized in a paragraph, this sounds simple, but there are many details to be worked out. Furthermore, if we are to avoid creating islands of incompatible content, these standards must truly remain standard -- let's hope Microsoft, Sun and IBM would rather share a large pie than each own a small one.
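To make the XML/SOAP pairing concrete, here is a minimal Python sketch of what a client might do with a stock-quote document like the one described above. The tag names, the GetQuote method, and the urn:example-quotes namespace are all hypothetical illustrations, not part of any real service; as note [3] points out, in practice both parties must agree on these definitions. The envelope shown is a simplified sketch of the SOAP idea, not a complete SOAP 1.1 message.

```python
import xml.etree.ElementTree as ET

# A hypothetical stock-quote document; the tag names (<quote>, <symbol>,
# <price>) are illustrative -- both client and server must agree on them.
quote_xml = """<quote>
  <symbol>MSFT</symbol>
  <price currency="USD">70.34</price>
  <timestamp>2001-06-22T16:00:00</timestamp>
</quote>"""

# The client parses the document and pulls out the fields it needs.
root = ET.fromstring(quote_xml)
symbol = root.findtext("symbol")
price = float(root.findtext("price"))

# A SOAP request wraps a method call and its parameters in an XML
# "envelope" sent to the server. GetQuote and the namespace are made up
# for this sketch.
soap_request = """<soap:Envelope
    xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="urn:example-quotes">
      <symbol>{}</symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>""".format(symbol)
```

Because both the document and the request are plain XML text, any platform that can parse XML can participate -- which is the point of building the three standards on it.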
The widespread interest in distributed applications may be due to the poor showing of the "build it and they will come" dot-com companies. People are looking for viable business models, and reselling their information and services might be profitable. Take the eBay auction site for example. At one point, they were upset that others were taking proprietary information from their site. They threatened legal action against such "screen scrapers" and altered their site formats to foil them. Today, eBay encourages republication of their auctions. They have created an application program interface and have implemented a wizard enabling others to mirror eBay auctions that are relevant to their Web sites. (eBay was one of the first large companies to announce support for Microsoft .NET.)

Republication of news, stock quotes, auctions, and flight information on the Web are valuable applications, but they are only the tip of the iceberg -- applications in business, government and other organizations float below the surface. The same tools and standards we use to share information on Internet sites can be used on intranets and extranets. .NET and its competitors will be used to share information between companies and their suppliers and customers, government agencies and citizens, departments and divisions within organizations, etc.

If distributed Web applications are viable today, they will be increasingly attractive as network speed increases. As optical Ethernet rolls out in local, metropolitan and wide area networks, the overhead imposed by distributing an application will diminish. Dramatic increases in network, storage and processing speed allow us to evolve toward a vision of the network as being inside a computer, directly connected to the CPU and memory, rather than as an externally accessed input/output device. In that situation, business considerations, not technology, will determine where an application executes or data is stored.
To this point, we have discussed distributed applications and the standards underlying them generically, but Microsoft has gone a step beyond Sun and IBM by announcing HailStorm, a set of .NET services. The HailStorm services are an extension of Microsoft's current Passport service, which stores user profile information, and their instant messaging system. HailStorm will store a user's profile, including credit card, contact and calendar information and the e-mail addresses and other characteristics of friends and colleagues. Having this sort of information available on a server would enable conveniences like one-click shopping and applications like the travel agent example described above, in which the system presents the user with a list of colleagues who could be automatically notified when his or her flight arrived. Furthermore, once a user had logged in to his or her client machine, it would no longer be necessary to give passwords at the HailStorm-using sites they visited.

It is no surprise that the HailStorm announcement has generated debate. Critics[4] worry that Microsoft would sell user profiles and market research data and that HailStorm servers would be tempting targets for hackers. Microsoft counters that they would never sell or look at user information, and would block unauthorized access. They assure us that our data would be safer and more conveniently managed on a HailStorm server than on our own machines, and that we would have full control over which people and programs could access it. I may be naïve, but I am willing to trust my data to a server in exchange for reliability, backup, the convenience of a single login, and more intelligence in my applications. My worry is not that Microsoft will profit from or mishandle my data; my worry is that they will do a good job and succeed.
If HailStorm services were to become ubiquitous, we would become dependent upon them, leaving Microsoft, or any other monopoly provider, in a position to gouge us on service and transaction fees. At the HailStorm announcement press conference, Bill Gates stated, "it's our goal to have virtually everybody who uses the Internet to have one of these Passport connections."[5]

Can you imagine a world in which visits to even, say, 10 percent of Internet sites require that you have a valid Microsoft Passport, Microsoft is receiving a fee for every transaction you complete, and you are paying them for passport renewal and additional services? Today I need a U.S. passport, a couple of credit cards, a driver's license, and a Social Security card -- I am not enthusiastic about adding a HailStorm Passport to that list.

I cannot imagine HailStorm working, or my participating in it, without either robust competition or government control in the personal-profile services market. I want a choice among many profile-service vendors who use open standards to achieve interoperability. Who might compete? The major credit card companies are one possibility -- they have experience with user authentication on a global[6] scale. Novell, which has excellent directory database technology, promised that a similar-sounding service called "Digital Me" would be available on the Internet in June 1999, but I have not heard more about it.[7] AOL must also be thinking along these lines. The other players with their cards in my wallet are the state and federal governments -- should they issue our Internet passports?

HailStorm seems to incorporate two other Microsoft lock-ins. Applications will use Microsoft's (not AOL's) instant messaging service to communicate with users, for example, telling them when a flight is delayed or their bid is exceeded in an auction. HailStorm also includes storage services like "MyPictures" and "MyDocuments."
We are having trouble deciding who controls the Internet domain-name taxonomy today -- do we want to turn the directory structure over to Microsoft tomorrow? Less than a year ago, it seemed Microsoft might be broken up by the Justice Department. Today they are planning to issue cyberspace passports to everyone on Earth and reportedly don't even have a contingency plan for a possible breakup.[8]

It remains to be seen whether distributed Web applications become as important as the Microsoft (and Sun and IBM) PR folks would have us believe. Development tools and servers are in beta test, we still have network bottlenecks, standards are evolving and may be defeated by corporate self-interest, deployment will take longer and be more difficult than expected, and the marketplace or the Justice Department may have reservations about letting Microsoft establish a monopoly over what may one day become a necessity. Still, Microsoft says they are betting the company on the distributed Web paradigm -- do you think they will win the bet?

[1] IBM calls their initiative IBM Web Services, and Sun's is the Open Network Environment (ONE).

[2] It is not that Microsoft invented the standards and architecture upon which .NET is based, but they are quickly gearing up to deploy software using them and are publicizing the effort.

[3] Of course, all parties participating in the development of the application must agree on the definitions of the tags to be used. This is technically simple, but often difficult from a business standpoint.

[4] For example, Procomp, a trade association that favors competition (i.e., is anti-Microsoft), has been critical of .NET. See their white paper on .NET at http://www.procompetition.org/headlines/WhitePaper5_15.pdf.

[5] Remarks by Bill Gates at the HailStorm introduction, Redmond, Washington, March 19, 2001, http://www.microsoft.com/billgates/speeches/2001/03-19HailStorm.asp.
[6] Microsoft Passports are currently available in 26 languages.

[7] Novell CEO Eric Schmidt described Digital Me as part of a vision of a customer-centered Internet similar to Microsoft's in a keynote speech at NetWorld+Interop in 1999.

[8] Schlender, Brent, "The Beast Is Back," Fortune, June 11, 2001.
Released: June 22, 2001. © 2001 Laurence Press. All rights reserved.